    Conservation Audit and Stewardship Plan

    Get PDF
    The Town of North Hampton, NH contains significant portions of two river drainages: the Little River, which flows directly into the Atlantic Ocean, and the Winnicut River, which flows northward into the Great Bay estuary. Because the Town sits within the watersheds of these two rivers, land use in the Town has a significant effect on the quality of the water bodies into which they flow. Preservation of open space, especially in the vicinity of these rivers and their associated wetlands, has been identified as a priority by the Town of North Hampton and by regional studies, including the 2006 State of the Estuaries Report by the NH Estuaries Project. Recognizing the Town's strategic location with respect to these resources, the Town Conservation Commission has worked over the past three decades to preserve open space by a variety of means.

    SHADOWS OF EMPIRE: THE DISPLACED NEW WORLD OF ANTONY AND CLEOPATRA

    Get PDF

    Conservation Lands Audit and Online Inventory for Dover, NH

    Get PDF
    This project developed a functional inventory of the numerous conservation lands located throughout the City of Dover. The City's existing records of conservation lands were disjointed and clearly in need of organization. A consultant was hired to conduct the research necessary to produce a complete, up-to-date "functional inventory" that is readily accessible to municipal volunteers and the public through the City website.

    Loopholes And Financial Innovation

    Get PDF
    The loophole phenomenon occurs in financial services, in bureaucratically controlled marketplaces, and in everyday life. Whenever an entity writes a rule, that entity simultaneously creates many loopholes (Kane, 1977). Loopholes provide a way to subvert regulations and may even stimulate innovation. Different parties search out loopholes to adapt to change as processes evolve over time. We discuss how various parties have reacted to specific loopholes. Our discussion seeks to provide a better understanding of the evolutionary interplay between financial innovation and regulation, as well as instruction on how to identify and exploit loopholes.

    High-level Cryptographic Abstractions

    Full text link
    The interfaces exposed by commonly used cryptographic libraries are clumsy, complicated, and assume an understanding of cryptographic algorithms. The challenge is to design high-level abstractions that require minimal knowledge and effort to use while still allowing maximum control when needed. This paper proposes such high-level abstractions, consisting of simple cryptographic primitives and fully declarative configuration. These abstractions can be implemented on top of any cryptographic library in any language. We have implemented them in Python and used them to write a wide variety of well-known security protocols, including Signal, Kerberos, and TLS. We show that programs using our abstractions are much smaller and easier to write than those using low-level libraries, reducing the size of the implemented security protocols by about a third on average. Our implementation incurs a small overhead: less than 5 microseconds for shared-key operations and less than 341 microseconds (< 1%) for public-key operations. We also show that our abstractions are safe against the main types of cryptographic misuse reported in the literature.
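
    The paper's own Python library is not reproduced here, but a minimal sketch can illustrate the kind of high-level abstraction it argues for: a wrapper that hides key handling, algorithm choice, and encoding behind a couple of calls. The sketch below builds on the widely used "cryptography" package; the wrapper name SecretBox and its methods seal/open_sealed are illustrative assumptions, not the paper's API.

    # A minimal sketch (not the paper's library) of a high-level symmetric-key
    # abstraction of the kind described above, built on the "cryptography" package.
    # The names SecretBox, seal and open_sealed are illustrative only.
    from typing import Optional
    from cryptography.fernet import Fernet

    class SecretBox:
        """Hide key handling, algorithm choice, and encoding behind two calls."""

        def __init__(self, key: Optional[bytes] = None):
            # Generate a fresh key unless the caller supplies one.
            self._key = key or Fernet.generate_key()
            self._fernet = Fernet(self._key)

        @property
        def key(self) -> bytes:
            return self._key

        def seal(self, plaintext: bytes) -> bytes:
            # Authenticated encryption; the token embeds a timestamp and a MAC.
            return self._fernet.encrypt(plaintext)

        def open_sealed(self, token: bytes) -> bytes:
            # Raises cryptography.fernet.InvalidToken on tampering or a wrong key.
            return self._fernet.decrypt(token)

    if __name__ == "__main__":
        box = SecretBox()
        token = box.seal(b"attack at dawn")
        assert box.open_sealed(token) == b"attack at dawn"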

    Characterisation of novel biomass degradation enzymes from the genome of Cellulomonas fimi

    Get PDF
    Recent analyses of genome sequences belonging to cellulolytic bacteria have revealed many genes potentially coding for cellulosic biomass degradation enzymes. Annotation of these genes, however, is based on few biochemically characterised examples. Here we present a simple strategy, based on BioBricks, for the rapid screening of candidate genes expressed in Escherichia coli. As proof of principle we identified over 70 putative biomass-degrading genes from the bacterium Cellulomonas fimi and expressed a subset of these in BioBrick format. Six novel genes showed activity in E. coli, and four enzymes of particular interest were characterised further: the α-L-arabinofuranosidase AfsB, the β-xylosidases BxyF and BxyH, and the multi-functional β-cellobiosidase/xylosidase XynF were partially purified to determine their optimum pH, optimum temperature and kinetic parameters. One of these enzymes, BxyH, was unexpectedly found to be highly active at strongly alkaline pH and at temperatures as high as 100 °C. This report demonstrates a simple method for quickly screening and characterising putative genes as BioBricks.
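
    The abstract does not spell out the fitting procedure, but kinetic parameters of the kind reported for AfsB, BxyF, BxyH and XynF are typically obtained by fitting the Michaelis-Menten equation to initial-rate data. The sketch below shows one way to do that in Python with scipy; the substrate concentrations and rates are invented purely for illustration.

    # Illustrative only: fitting the Michaelis-Menten equation v = Vmax*S/(Km + S)
    # to hypothetical initial-rate data, the usual route to kinetic parameters
    # such as those reported above. All numbers are made up.
    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(s, vmax, km):
        return vmax * s / (km + s)

    # Hypothetical substrate concentrations (mM) and initial rates (umol/min/mg).
    s = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 5.0, 10.0])
    v = np.array([0.9, 1.9, 3.1, 4.6, 6.0, 7.4, 8.0])

    (vmax, km), _ = curve_fit(michaelis_menten, s, v, p0=[v.max(), 1.0])
    print(f"Vmax ~ {vmax:.2f} umol/min/mg, Km ~ {km:.2f} mM")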

    A Family History of Lethal Prostate Cancer and Risk of Aggressive Prostate Cancer in Patients Undergoing Radical Prostatectomy.

    Get PDF
    We investigated whether a family history of lethal prostate cancer (PCa) is associated with high-risk disease or biochemical recurrence in patients undergoing radical prostatectomy. A cohort of radical prostatectomy patients was stratified into men with no family history of PCa (NFH), men with a first-degree relative with PCa (FH), and men with a first-degree relative who had died of PCa (FHD). Demographic, operative and pathologic outcomes were analyzed. Freedom from biochemical recurrence was examined using Kaplan-Meier analysis with the log-rank test, and a multivariate Cox regression analysis was also performed. We analyzed 471 men who underwent radical prostatectomy at our institution and had a known family history: 355 patients (75%) in the NFH group, 97 (21%) in the FH group, and 19 (4%) in the FHD group. The prevalence of Gleason score ≥8 disease, higher pathologic T stage, and biochemical recurrence (BCR) did not differ significantly between groups, and on Kaplan-Meier analysis there were no differences in short-term BCR rates (p = 0.212). In this cohort of patients undergoing radical prostatectomy, those with a first-degree relative who died of PCa did not have an increased likelihood of high-risk or aggressive PCa, or a higher short-term risk of BCR, than those without such a history.
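
    As a rough illustration of the survival analysis described above (Kaplan-Meier freedom-from-BCR curves compared across the three family-history groups with a log-rank test), the following Python sketch uses the lifelines package. The file name and column names (months_to_bcr, bcr_event, family_history) are hypothetical stand-ins for the study's variables, not the authors' data.

    # A sketch of the survival analysis described above using the "lifelines"
    # package; the DataFrame columns are hypothetical stand-ins.
    import pandas as pd
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import multivariate_logrank_test

    df = pd.read_csv("prostatectomy_cohort.csv")  # hypothetical file

    kmf = KaplanMeierFitter()
    ax = None
    for group, sub in df.groupby("family_history"):   # NFH / FH / FHD
        kmf.fit(sub["months_to_bcr"], event_observed=sub["bcr_event"], label=group)
        ax = kmf.plot_survival_function(ax=ax)

    # Log-rank test across the three family-history groups.
    result = multivariate_logrank_test(df["months_to_bcr"],
                                       df["family_history"],
                                       df["bcr_event"])
    print(result.p_value)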

    Who Benefits from KIPP?

    Get PDF
    The nation's largest charter management organization is the Knowledge Is Power Program (KIPP). KIPP schools are emblematic of the No Excuses approach to public education, a highly standardized and widely replicated charter model that features a long school day, an extended school year, selective teacher hiring, strict behavior norms, and a focus on traditional reading and math skills. No Excuses charter schools are sometimes said to focus on relatively motivated high achievers at the expense of the students who are most difficult to teach, including limited English proficiency (LEP) and special education (SPED) students, as well as students with low baseline achievement levels. We use applicant lotteries to evaluate the impact of KIPP Academy Lynn, a KIPP school in Lynn, Massachusetts that typifies the KIPP approach. Our analysis focuses on special needs students who may be underserved. The results show average achievement gains of 0.36 standard deviations in math and 0.12 standard deviations in reading for each year spent at KIPP Lynn, with the largest gains coming from the LEP, SPED, and low-achievement groups. The average reading gains are driven almost completely by SPED and LEP students, whose reading scores rise by roughly 0.35 standard deviations for each year spent at KIPP Lynn. Keywords: human capital, charter schools, achievement
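
    The lottery-based design can be made concrete with a standard instrumental-variables sketch: the randomized lottery offer instruments for years of attendance at KIPP Lynn, so the estimated coefficient is the per-year achievement gain. The Python sketch below uses the linearmodels package; the file and column names are hypothetical and the specification is simplified relative to the paper's.

    # A sketch of the lottery-based evaluation strategy: the randomized lottery
    # offer instruments for years attended at KIPP Lynn in a 2SLS regression.
    # Uses the "linearmodels" package; all names are hypothetical.
    import pandas as pd
    from linearmodels.iv import IV2SLS

    df = pd.read_csv("kipp_lynn_applicants.csv")  # hypothetical file
    df["const"] = 1.0

    model = IV2SLS(dependent=df["math_score"],         # outcome in standard deviations
                   exog=df[["const", "baseline_score"]],
                   endog=df["years_at_kipp"],          # endogenous treatment
                   instruments=df["won_lottery"])      # randomized lottery offer
    print(model.fit(cov_type="robust").summary)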

    Nearly-optimal state preparation for quantum simulations of lattice gauge theories

    Full text link
    We present several improvements to the recently developed ground state preparation algorithm based on the Quantum Eigenvalue Transformation for Unitary Matrices (QETU), apply this algorithm to a lattice formulation of U(1) gauge theory in 2+1D, and propose a novel application of QETU: a highly efficient preparation of Gaussian distributions. The QETU technique was originally proposed as an algorithm for nearly-optimal ground state preparation and ground state energy estimation on early fault-tolerant devices. It uses the time-evolution input model, which can potentially overcome the large overall prefactor in the asymptotic gate cost that arises in similar algorithms based on the Hamiltonian input model. We present modifications to the original QETU algorithm that significantly reduce the cost for both exact and Trotterized implementations of the time-evolution circuit. We use QETU to prepare the ground state of a U(1) lattice gauge theory in 2 spatial dimensions, explore the dependence of computational resources on the desired precision and system parameters, and discuss the applicability of our results to general lattice gauge theories. We also demonstrate how the QETU technique can be utilized for preparing Gaussian distributions and wave packets in a way that outperforms existing algorithms for as few as n_q > 2-3 qubits.
    Comment: 29 pages, 4 appendices, 19 figures
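
    As a point of reference for the Gaussian-preparation application, the target of such a routine is simply a discretized, normalized Gaussian wave packet over the 2^n_q computational basis states. The numpy sketch below writes out that target state; the grid range, width, centre and momentum are arbitrary illustrative choices, and the QETU circuit itself is not shown.

    # Illustrative only: the target state that a Gaussian-preparation routine
    # aims to produce, written as a discretized, normalized wave packet on a
    # 2**n_q point grid. All parameter values are arbitrary.
    import numpy as np

    n_q = 5                                  # number of qubits
    x = np.linspace(-4.0, 4.0, 2**n_q)       # spatial grid
    sigma, x0, k0 = 1.0, 0.0, 2.0            # width, center, momentum

    psi = np.exp(-(x - x0)**2 / (2 * sigma**2)) * np.exp(1j * k0 * x)
    psi /= np.linalg.norm(psi)               # amplitudes of the 2**n_q basis states

    print(np.vdot(psi, psi).real)            # 1.0: properly normalized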